Markov function

See also in other dictionaries:

  • Markov decision process — Markov decision processes (MDPs), named after Andrey Markov, provide a mathematical framework for modeling decision making in situations where outcomes are partly random and partly under the control of a decision maker. MDPs are useful for… …   Wikipedia
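
As a point of reference for the entry above, the standard textbook formulation writes an MDP as a tuple together with the Bellman optimality equation. The notation below is the conventional one, not taken from the truncated excerpt.

```latex
% An MDP as the usual tuple (standard textbook notation, added for reference):
%   S - states, A - actions, P(s' | s, a) - transition probabilities,
%   R(s, a) - expected reward, gamma in [0, 1) - discount factor.
\[
  \text{MDP} = (S, A, P, R, \gamma), \qquad
  V^{*}(s) = \max_{a \in A} \Bigl( R(s,a) + \gamma \sum_{s' \in S} P(s' \mid s, a)\, V^{*}(s') \Bigr)
\]
```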

  • Markov's inequality — gives an upper bound for the measure of the set where a nonnegative measurable function f(x) exceeds a given level ε. The bound combines the level ε with the average value of f. …   Wikipedia
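
The bound the entry describes in words is the following standard inequality; the probabilistic and measure-theoretic forms below use conventional notation rather than notation from the excerpt.

```latex
% Markov's inequality for a nonnegative random variable X and level a > 0:
\[
  \Pr\bigl(X \ge a\bigr) \;\le\; \frac{\mathbb{E}[X]}{a}
\]
% Measure-theoretic form for nonnegative measurable f and level \varepsilon > 0:
\[
  \mu\bigl(\{x : f(x) \ge \varepsilon\}\bigr) \;\le\; \frac{1}{\varepsilon} \int f \, d\mu
\]
```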

  • Markov perfect equilibrium — A solution concept in game theory; a refinement (subset) of subgame perfect equilibrium. …   Wikipedia

  • Markov's principle — Markov's principle, named after Andrey Markov Jr., is a classical tautology that is not intuitionistically valid but that may be justified by constructive means. There are many equivalent formulations of Markov's principle. …   Wikipedia
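
One common arithmetic formulation of the principle mentioned above, in standard notation (the excerpt's own statements are truncated): for a decidable predicate P on the natural numbers,

```latex
% Markov's principle for a decidable predicate P over the natural numbers:
\[
  \forall n\, \bigl(P(n) \lor \lnot P(n)\bigr) \;\land\; \lnot\lnot\, \exists n\, P(n)
  \;\longrightarrow\; \exists n\, P(n)
\]
```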

  • Markov chain geostatistics — refers to the Markov chain models, simulation algorithms and associated spatial correlation measures (e.g., transiogram) based on Markov chain random field theory, which extends a single Markov chain into a multi-dimensional field for… …   Wikipedia

  • Markov Chain Monte Carlo — Markov chain Monte Carlo methods (MCMC methods for short; less commonly, Markov-chain Monte Carlo methods) are a class of algorithms that draw samples from probability distributions. They do so by constructing a Markov chain,… …   Deutsch Wikipedia

  • Markov network — A Markov network, or Markov random field, is a model of the (full) joint probability distribution of a set $\mathcal{X}$ of random variables having the Markov property. A Markov network is similar to a Bayesian network in its representation of… …   Wikipedia
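
The joint distribution the entry refers to is usually written as a product of potentials over the cliques of the graph; the factorization below is the standard form, not quoted from the excerpt.

```latex
% Standard factorization of a Markov network / Markov random field over the
% cliques C of its graph; Z is the normalizing partition function.
\[
  P(X = x) \;=\; \frac{1}{Z} \prod_{C} \phi_{C}(x_{C}),
  \qquad
  Z \;=\; \sum_{x} \prod_{C} \phi_{C}(x_{C})
\]
```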

  • Markov logic network — A Markov logic network (or MLN) is a probabilistic logic which applies the ideas of a Markov network to first order logic, enabling uncertain inference. Markov logic networks generalize first order logic, in the sense that, in a certain limit,… …   Wikipedia
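
The "certain limit" alluded to above is typically the one in which all formula weights tend to infinity, so that only worlds satisfying every formula retain probability mass. The standard MLN distribution, in conventional notation rather than the excerpt's, is:

```latex
% Markov logic network distribution over possible worlds x:
% w_i is the weight of first-order formula F_i, n_i(x) the number of true
% groundings of F_i in world x, and Z normalizes over all worlds.
\[
  P(X = x) \;=\; \frac{1}{Z} \exp\!\Bigl( \sum_{i} w_i \, n_i(x) \Bigr)
\]
```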

  • Markov chain — [figure: a simple two-state Markov chain] A Markov chain, named for Andrey Markov, is a mathematical system that undergoes transitions from one state to another, between a finite or countable number of possible states. It is a random process characterized… …   Wikipedia
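
A minimal sketch of the "simple two-state Markov chain" the missing figure depicts: the state names, transition probabilities, and helper functions below are illustrative assumptions, not taken from the source.

```python
# Two-state Markov chain sketch (states and probabilities are illustrative).
import random

# Transition probabilities: outer key = current state, inner dict = P(next state).
TRANSITIONS = {
    "A": {"A": 0.7, "B": 0.3},
    "B": {"A": 0.4, "B": 0.6},
}

def step(state: str) -> str:
    """Draw the next state given only the current one (the Markov property)."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off

def simulate(start: str, n_steps: int) -> list[str]:
    """Return a trajectory of length n_steps + 1 starting from `start`."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

if __name__ == "__main__":
    print(simulate("A", 10))
```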

  • Markov chain Monte Carlo — Markov chain Monte Carlo (MCMC) methods (which include random walk Monte Carlo methods) are a class of algorithms for sampling from probability… …   Wikipedia
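
As a concrete member of the MCMC family both entries describe, here is a minimal random-walk Metropolis sampler; the target density, function names, and step size are illustrative assumptions, not taken from either Wikipedia article.

```python
# Random-walk Metropolis sketch: samples from an unnormalized 1-D target density.
import math
import random

def unnormalized_target(x: float) -> float:
    """Illustrative target: a standard normal density up to a constant factor."""
    return math.exp(-0.5 * x * x)

def metropolis(n_samples: int, step_size: float = 1.0, x0: float = 0.0) -> list[float]:
    """Draw samples whose long-run distribution approximates the target."""
    samples = []
    x = x0
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step_size)      # symmetric random-walk proposal
        accept_prob = min(1.0, unnormalized_target(proposal) / unnormalized_target(x))
        if random.random() < accept_prob:                # Metropolis accept/reject step
            x = proposal
        samples.append(x)
    return samples

if __name__ == "__main__":
    chain = metropolis(10_000)
    print(sum(chain) / len(chain))  # should be close to 0 for this target
```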

  • Markov model — In probability theory, a Markov model is a stochastic model that assumes the Markov property. Generally, this assumption enables reasoning and computation with the model that would otherwise be intractable. …   Wikipedia
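
The Markov property the entry assumes, written out for a discrete-time process in standard notation:

```latex
% Markov property for a discrete-time process (X_n): the next state depends
% only on the current state, not on the earlier history.
\[
  \Pr\bigl(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0\bigr)
  \;=\;
  \Pr\bigl(X_{n+1} = x \mid X_n = x_n\bigr)
\]
```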
